A generalization of the Jensen divergence: The chord gap divergence
Author
Abstract
We introduce a novel family of distances, called the chord gap divergences, that generalizes the Jensen divergences (also called the Burbea-Rao distances), and study its properties. It follows a generalization of the celebrated statistical Bhattacharyya distance that is frequently met in applications. We report an iterative concave-convex procedure for computing centroids, and analyze the performance of the k-means++ clustering with respect to that new dissimilarity measure by introducing the Taylor-Lagrange remainder form of the skew Jensen divergences.
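As background for the construction in the abstract, the skew Jensen divergence of a convex generator F is J_F^α(p, q) = αF(p) + (1−α)F(q) − F(αp + (1−α)q), which is nonnegative by Jensen's inequality and recovers the symmetric Jensen (Burbea-Rao) divergence at α = 1/2. A minimal Python sketch (the generator choice and function names are illustrative assumptions, not code from the paper):

```python
import numpy as np

def skew_jensen_divergence(F, p, q, alpha=0.5):
    # J_F^alpha(p, q) = alpha*F(p) + (1-alpha)*F(q) - F(alpha*p + (1-alpha)*q).
    # Nonnegative for convex F by Jensen's inequality; alpha=0.5 gives the
    # symmetric Jensen / Burbea-Rao divergence.
    return alpha * F(p) + (1 - alpha) * F(q) - F(alpha * p + (1 - alpha) * q)

def neg_entropy(x):
    # Shannon negative entropy, a standard convex generator on the simplex
    # (assumes strictly positive entries).
    return float(np.sum(x * np.log(x)))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
d = skew_jensen_divergence(neg_entropy, p, q, alpha=0.5)
```

For exponential families, the skew Jensen divergence of the log-normalizer coincides with the skew Bhattacharyya distance, which is the link to the Bhattacharyya generalization mentioned above.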
Similar resources
On Unified Generalizations of Relative Jensen–Shannon and Arithmetic–Geometric Divergence Measures, and Their Properties (Pranesh Kumar and Inder Jeet Taneja)
Abstract. In this paper we consider a one-parametric generalization of some non-symmetric divergence measures, namely: Kullback-Leibler relative information, χ²-divergence, relative J-divergence, relative Jensen–Shannon divergence, and relative Arithmetic–Geometric divergence. All the generalizations considered can be written as particular case...
New Jensen and Ostrowski Type Inequalities for General Lebesgue Integral with Applications
Some new inequalities related to Jensen and Ostrowski inequalities for general Lebesgue integral are obtained. Applications for $f$-divergence measure are provided as well.
Generalizing Jensen and Bregman divergences with comparative convexity and the statistical Bhattacharyya distances with comparable means
Comparative convexity is a generalization of convexity relying on abstract notions of means. We define the (skew) Jensen divergence and the Jensen diversity from the viewpoint of comparative convexity, and show how to obtain the generalized Bregman divergences as limit cases of skewed Jensen divergences. In particular, we report explicit formula of these generalized Bregman divergences when con...
Properties of Classical and Quantum Jensen-Shannon Divergence
Jensen-Shannon divergence (JD) is a symmetrized and smoothed version of the most important divergence measure of information theory, Kullback divergence. As opposed to Kullback divergence it determines in a very direct way a metric; indeed, it is the square of a metric. We consider a family of divergence measures (JDα for α > 0), the Jensen divergences of order α, which generalize JD as JD1 = J...
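To make the JD construction above concrete, here is a minimal sketch of the classical Jensen-Shannon divergence for strictly positive discrete distributions (the definition is standard; the code itself is illustrative, not from the cited paper):

```python
import numpy as np

def kl(a, b):
    # Kullback-Leibler divergence for strictly positive discrete distributions.
    return float(np.sum(a * np.log(a / b)))

def jensen_shannon(p, q):
    # JD(p, q) = (1/2) KL(p || m) + (1/2) KL(q || m), with m = (p + q)/2.
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.1, 0.4, 0.5])
q = np.array([0.8, 0.1, 0.1])
r = np.array([0.3, 0.3, 0.4])

# sqrt(JD) is a metric; spot-check the triangle inequality on three points.
d_pq, d_qr, d_pr = (np.sqrt(jensen_shannon(a, b)) for a, b in [(p, q), (q, r), (p, r)])
```

The square-root metric property is exactly what the abstract emphasizes; the triangle-inequality check here is only a numerical spot check, not a proof.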
A note on decision making in medical investigations using new divergence measures for intuitionistic fuzzy sets
Srivastava and Maheshwari (Iranian Journal of Fuzzy Systems 13(1) (2016) 25-44) introduced a new divergence measure for intuitionistic fuzzy sets (IFSs). The properties of the proposed divergence measure were studied, and its efficiency in the context of medical diagnosis was also demonstrated. In this note, we point out some errors in ...
Journal: CoRR
Volume: abs/1709.10498
Publication year: 2017